Data and Democracy
The DPDI Bill Threatens the Integrity of the UK General Election
In our latest briefing, Open Rights Group raises the alarm over changes in the Data Protection and Digital Information Bill (DPDI Bill) that could open the floodgates for the abuse of data analytics and new technologies for electoral manipulation.
Information is power, and new technologies amplify governments’ and corporations’ ability to collect, use and weaponise personal data. Thus, data protection is an important first line of defence against practices that could mislead voters, manipulate public opinion, and threaten the integrity of the electoral process. However, the DPDI Bill would change UK data protection laws, making it easier for the government, private companies and political parties to spy on you and to collect and use information about you, but more difficult to investigate, unveil and hold to account those who may try to weaponise data analytics for political purposes.
The Bill has the potential to undermine legal standards that protect all of us from the ever-growing threats to the integrity of elections arising from technological developments — including the use of data analytics and micro-targeting. With a general election on the horizon, this poses an immediate risk. We are calling on the House of Lords to oppose dangerous proposals in the DPDI Bill that would:
- Reduce transparency over how personal data is used
- Multiply the avenues and caveats political parties can rely on to profile individuals against their will
- Make it more difficult to scrutinise data uses and hold law-breakers to account
- Reduce access to redress and the independence of the Information Commissioner’s Office (ICO)
- Increase the risk that political parties engage in novel and exploitative methods to obtain your personal data.
Why data protection matters in politics
Modern technologies allow the collection of vast amounts of personal data that can be used to guess or infer individuals’ personal opinions. Since 2016, the use of data analytics for electoral campaigning has steadily increased. For instance, in our 2020 “Who do they think we are” report we found that political parties were combining commercial data and information bought from data brokers with electoral rolls to build detailed profiles of individuals and target them with “tailored” political messaging, also known as “micro-targeting”.
Our report unveiled that these profiles are often factually inaccurate, but this does not make the practice any less dangerous: in the Cambridge Analytica scandal, political actors already exploited this system to target individuals with different electoral messages, raising concerns over the ability of such systems to manipulate public opinion. The investigation that followed this scandal revealed that political parties in the UK had been using electoral register data to draw detailed profiles of UK voters’ lifestyle, ethnicity and age, raising several concerns about the legality, transparency, fairness and accuracy of these depictions.
More recently, it has been suggested that in the Rochdale by-election, George Galloway MP may have targeted Muslim voters with different canvassing information — something that, if confirmed, would have given his constituents only a partial view of the topics and issues he campaigned on. It has also been revealed that Chinese-backed actors breached the UK electoral register, raising questions about foreign interference in the upcoming UK electoral process. In the US, this kind of political targeting has been used to target Black voters with robocalls giving them false information about how to register to vote or how to cast their ballot, such as suggesting the wrong polling date.
Micro-targeting threatens the integrity of the electoral process in at least two different ways. On the one hand, it enables political candidates to be two-faced and make different electoral promises to different political audiences, in a way that lacks transparency and leaves the public unaware of this double game. On the other hand, it allows bad-faith actors to target constituents with false information that can trick voters into making mistakes during the electoral process, such as missing the registration deadline or showing up at the polling station on the wrong day.
The DPDI Bill: wrong answers only
The UK Government is proposing changes to UK data protection laws that would lower legal standards, weaken data protection rights, reduce accountability, and undermine the independence of the ICO, the UK data protection authority that oversees, among other things, the use of personal data by political parties. These proposals have made a lot of people very angry and have been widely regarded as a bad move: as our briefing shows, this is no less true in the context of data analytics for political purposes.
Indeed, the Government is proposing to lower accountability standards and record-keeping requirements, as well as to make it more difficult for UK voters to exercise their right to access their personal data held by political parties and any other organisation. None of these changes will help provide a defence against the growing weaponisation of data analytics for political purposes. Records and other accountability documentation have proven to be an invaluable tool for journalists and public interest organisations who investigate and unveil malpractice. Likewise, the Cambridge Analytica investigations, or our own report on political profiling, would never have been possible without the right to access personal data, which enabled people to ask political parties or consultancy firms what data they held about UK voters and for what purposes.
Furthermore, the DPDI Bill would introduce a very wide, vague and unconditional basis for data processing for “democratic engagement” purposes, and allow online tracking and other high-risk profiling of our browsing habits without our prior consent. It would also give Ministers the power to appoint loyalists of their choice to the board of the ICO, and to dissuade it from investigating matters that the Government does not want investigated.
We need another Bill
The UK data protection reform is currently being scrutinised by the Lords, whose Constitution Committee has raised an eyebrow at, among other things, the new “democratic engagement” legal basis. At the same time, Open Rights Group has supported peers in tabling amendments that would strengthen the independence of the ICO, and several other civil society players have helped draft amendments that would restore high legal standards, protect the right to access personal data, and retain effective accountability requirements.
All this underscores a deeper issue: the DPDI Bill is a collection of bad proposals the Government formulated after a lopsided consultation process, a stubborn determination to ignore the opinions of independent experts and public interest organisations and, lastly, the decision to table 150 pages of amendments a few days before their discussion in the House of Commons. In turn, Members of Parliament have been denied a fair chance to scrutinise the Government’s proposals, and the Lords are now left with the unpleasant job of cleaning up this mess.